10 research outputs found

    A polyominoes-permutations injection and tree-like convex polyominoes

    Abstract: Plane polyominoes are edge-connected sets of cells on the orthogonal lattice Z^2, considered identical if their cell sets are equal up to an integral translation. We introduce a novel injection from the set of polyominoes with n cells to the set of permutations of [n], and classify the families of convex polyominoes and tree-like convex polyominoes as classes of permutations that avoid some sets of forbidden patterns. By analyzing the structure of the respective permutations of the family of tree-like convex polyominoes, we are able to find the generating function of the sequence that enumerates this family, conclude that this sequence satisfies the linear recurrence a_n = 6a_{n-1} - 14a_{n-2} + 16a_{n-3} - 9a_{n-4} + 2a_{n-5}, and compute the closed-form formula a_n = 2^{n+2} - (n^3 - n^2 + 10n + 4)/2.
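
    The closed form and the recurrence above can be checked against each other mechanically. The short Python sketch below (no dependencies) only verifies that the quoted closed form satisfies the quoted recurrence over a range of n; it does not, by itself, establish anything about the polyomino counts.

        # Consistency check between the two formulas quoted in the abstract.
        from fractions import Fraction

        def closed_form(n):
            # a_n = 2^(n+2) - (n^3 - n^2 + 10n + 4)/2, kept exact with Fraction
            return Fraction(2) ** (n + 2) - Fraction(n**3 - n**2 + 10 * n + 4, 2)

        for n in range(6, 40):
            rhs = (6 * closed_form(n - 1) - 14 * closed_form(n - 2)
                   + 16 * closed_form(n - 3) - 9 * closed_form(n - 4)
                   + 2 * closed_form(n - 5))
            assert closed_form(n) == rhs, f"mismatch at n={n}"
        print("closed form satisfies the recurrence for n = 6..39")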

    The computational complexity of structure-based causality

    Halpern and Pearl introduced a definition of actual causality; Eiter and Lukasiewicz showed that computing whether X=x is a cause of Y=y is NP-complete in binary models (where all variables can take on only two values) and \Sigma_2^P-complete in general models. In the final version of their paper, Halpern and Pearl slightly modified the definition of actual cause, in order to deal with problems pointed out by Hopkins and Pearl. As we show, this modification has a nontrivial impact on the complexity of computing actual cause. To characterize the complexity, a new family D_k^P, k = 1, 2, 3, ..., of complexity classes is introduced, which generalizes the class DP introduced by Papadimitriou and Yannakakis (DP is just D_1^P). We show that the complexity of computing causality under the updated definition is D_2^P-complete. Chockler and Halpern extended the definition of causality by introducing notions of responsibility and blame. The complexity of determining the degree of responsibility and blame using the original definition of causality was completely characterized. Again, we show that changing the definition of causality affects the complexity, and completely characterize it using the updated definition. Comment: Appears in AAAI 201
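
    For orientation: DP is the class of languages expressible as the intersection of a language in NP and a language in coNP (the class of Papadimitriou and Yannakakis mentioned above). The abstract does not spell out D_k^P; a generalization consistent with "DP is just D_1^P" (and presumably the intended one, though the paper should be consulted for the exact definition) replaces NP and coNP by the k-th levels of the polynomial hierarchy:

        D^P   = \{ L_1 \cap L_2 : L_1 \in NP,         L_2 \in coNP \}
        D_k^P = \{ L_1 \cap L_2 : L_1 \in \Sigma_k^P, L_2 \in \Pi_k^P \}

    so that D_1^P = DP, since \Sigma_1^P = NP and \Pi_1^P = coNP.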

    Generalized Counterexamples to Liveness Properties

    Abstract: We consider generalized counterexamples in the context of liveness property checking. A generalized counterexample comprises only a subset of the values necessary to establish the existence of a concrete counterexample. While useful in various ways even for safety properties, a generalized liveness counterexample may be exponentially shorter than a concrete counterexample, entailing significant potential algorithmic benefits. One application of this concept extends the k-LIVENESS proof technique of [1] to enable failure detection. The resulting algorithm is simple, and adds negligible overhead to k-LIVENESS in practice. We additionally propose dedicated algorithms to search for generalized liveness counterexamples, and to convert generalized counterexamples to and from concrete ones. Experiments confirm the capability of these techniques to detect failures more efficiently than existing techniques for various benchmarks.
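
    As a purely illustrative aside (not the paper's algorithms), the hypothetical Python sketch below models a counterexample trace as a list of per-step assignments: a generalized counterexample fixes values for only some signals at each step, and a concrete trace "completes" it if it agrees on every fixed value.

        # Hypothetical illustration: names and shapes are invented, not taken from the paper.
        Step = dict  # signal name -> bool

        def completes(concrete: list[Step], generalized: list[Step]) -> bool:
            """True if the fully assigned trace agrees with every value fixed by the generalized one."""
            if len(concrete) != len(generalized):
                return False
            return all(concrete[i].get(sig) == val
                       for i, partial in enumerate(generalized)
                       for sig, val in partial.items())

        generalized = [{"req": True}, {"req": True}, {"req": False}]   # 'ack' left unconstrained
        concrete    = [{"req": True, "ack": False},
                       {"req": True, "ack": True},
                       {"req": False, "ack": True}]
        print(completes(concrete, generalized))  # True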

    Counting polycubes without the dimensionality curse

    Abstract: d-dimensional polycubes are the generalization of planar polyominoes to higher dimensions. That is, a d-D polycube of size n is a connected set of n cells of a d-dimensional hypercubic lattice, where connectivity is through (d−1)-dimensional faces of the cells. Computing A_d(n), the number of distinct d-dimensional polycubes of size n, is a long-standing elusive problem in discrete geometry. In a previous work we described the generalization from two to higher dimensions of a polyomino-counting algorithm of Redelmeier [D.H. Redelmeier, Counting polyominoes: Yet another attack, Discrete Math. 36 (1981) 191–203]. The main deficiency of the algorithm is that it keeps the entire set of cells that appear in any possible polycube in memory at all times. Thus, the amount of required memory grows exponentially with the dimension. In this paper we present an improved version of the same method, whose order of memory consumption is a (very low) polynomial in both n and d. We also describe how we parallelized the algorithm and ran it over the Internet on dozens of computers simultaneously.
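
    For context, the sketch below is a compact Python version of the classic 2-D Redelmeier procedure that this work generalizes (counting fixed polyominoes by size); it is only the textbook baseline, not the memory-efficient d-dimensional algorithm described in the paper.

        # Redelmeier's enumeration of fixed polyominoes up to size n (2-D baseline).
        # Growth is restricted to the canonical half-plane {y > 0} U {y = 0, x >= 0},
        # and a shared 'reached' set ensures no cell is offered twice on one branch,
        # so every fixed polyomino is generated exactly once.
        def redelmeier(n):
            counts = [0] * (n + 1)

            def in_domain(c):
                x, y = c
                return y > 0 or (y == 0 and x >= 0)

            def nbrs(c):
                x, y = c
                return ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1))

            def grow(untried, size, reached):
                while untried:
                    c = untried.pop()
                    counts[size + 1] += 1
                    if size + 1 < n:
                        new = [nb for nb in nbrs(c) if in_domain(nb) and nb not in reached]
                        reached.update(new)
                        grow(untried + new, size + 1, reached)
                        reached.difference_update(new)

            grow([(0, 0)], 0, {(0, 0)})
            return counts

        print(redelmeier(7)[1:])  # [1, 2, 6, 19, 63, 216, 760], i.e. OEIS A001168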

    Designing Reliable Cyber-Physical Systems

    Cyber-physical systems, which consist of a cyber part (a computing system) and a physical part (the system in the physical environment), as well as the respective interfaces between those parts, are omnipresent in our daily lives. The application in the physical environment drives the overall requirements that must be respected when designing the computing system. Here, reliability is a core aspect, and some of the most pressing design challenges are: monitoring failures throughout the computing system, determining the impact of failures on the application constraints, and ensuring correctness of the computing system with respect to application-driven requirements rooted in the physical environment. This chapter gives an overview of the state-of-the-art techniques developed within the Horizon 2020 project IMMORTAL that tackle these challenges throughout the stack of layers of the computing system while tightly coupling the design methodology to the physical requirements. (The chapter is based on the contributions of the special session "Designing Reliable Cyber-Physical Systems" of the Forum on Specification and Design Languages (FDL) 2016.)

    Opportunistic infections and AIDS malignancies early after initiating combination antiretroviral therapy in high-income countries

    Background: There is little information on the incidence of AIDS-defining events which have been reported in the literature to be associated with immune reconstitution inflammatory syndrome (IRIS) after combined antiretroviral therapy (cART) initiation. These events include tuberculosis, mycobacterium avium complex (MAC), cytomegalovirus (CMV) retinitis, progressive multifocal leukoencephalopathy (PML), herpes simplex virus (HSV), Kaposi sarcoma, non-Hodgkin lymphoma (NHL), cryptococcosis and candidiasis. Methods: We identified individuals in the HIV-CAUSAL Collaboration, which includes data from six European countries and the US, who were HIV-positive between 1996 and 2013, antiretroviral therapy naive, aged at least 18 years, had CD4+ cell count and HIV-RNA measurements, and had been AIDS-free for at least 1 month between those measurements and the start of follow-up. For each AIDS-defining event, we estimated the hazard ratio for no cART versus less than 3 months and at least 3 months since cART initiation, adjusting for time-varying CD4+ cell count and HIV-RNA via inverse probability weighting. Results: Out of 96 562 eligible individuals (78% men) with median (interquartile range) follow-up of 31 [13-65] months, 55 144 initiated cART. The number of cases varied between 898 for tuberculosis and 113 for PML. Compared with no cART initiation, the hazard ratios (95% confidence intervals) up to 3 months after cART initiation were 1.21 (0.90-1.63) for tuberculosis, 2.61 (1.05-6.49) for MAC, 1.17 (0.34-4.08) for CMV retinitis, 1.18 (0.62-2.26) for PML, 1.21 (0.83-1.75) for HSV, 1.18 (0.87-1.58) for Kaposi sarcoma, 1.56 (0.82-2.95) for NHL, 1.11 (0.56-2.18) for cryptococcosis and 0.77 (0.40-1.49) for candidiasis. Conclusion: With the potential exception of mycobacterial infections, unmasking IRIS does not appear to be a common complication of cART initiation in high-income countries.
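
    The "inverse probability weighting" step can be illustrated with a minimal, hypothetical sketch: fit one model for the probability of the observed treatment given baseline covariates (numerator) and one given baseline plus time-varying covariates (denominator), then weight each person-month by their ratio. The column names below are invented for illustration and do not correspond to the HIV-CAUSAL data; in a full marginal structural model these per-period weights would also be cumulated within each individual over follow-up.

        # Hypothetical sketch of stabilized inverse probability weights for a
        # time-varying treatment; column names are illustrative, not the study's.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def stabilized_ipw(df):
            # df: pandas DataFrame, one row per person-month, with columns
            #   'cart'            (1 if cART has been initiated by this month, else 0),
            #   'cd4', 'log_rna'  (time-varying covariates),
            #   'age', 'sex'      (baseline covariates, numerically coded).
            base = ["age", "sex"]
            full = base + ["cd4", "log_rna"]
            num = LogisticRegression(max_iter=1000).fit(df[base], df["cart"])
            den = LogisticRegression(max_iter=1000).fit(df[full], df["cart"])
            p_num = num.predict_proba(df[base])[:, 1]
            p_den = den.predict_proba(df[full])[:, 1]
            # probability of the treatment level actually observed at each person-month
            obs_num = np.where(df["cart"] == 1, p_num, 1 - p_num)
            obs_den = np.where(df["cart"] == 1, p_den, 1 - p_den)
            return obs_num / obs_den  # stabilized weight per person-month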